
    Ultrasonic Waves in Water Visualized With Schlieren Imaging

    The Acoustic Liquid Manipulation project at the NASA Glenn Research Center at Lewis Field is working with high-intensity ultrasound waves to produce acoustic radiation pressure and acoustic streaming. These effects can be used to propel liquid flows and to manipulate floating objects and liquid surfaces. Interest in acoustic liquid manipulation has been shown in acoustically enhanced circuit board electroplating, microelectromechanical systems (MEMS), and microgravity space experiments. The current areas of work on this project include phased-array ultrasonic beam steering, acoustic intensity measurements, and schlieren imaging of the ultrasonic waves.
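
    The phased-array beam steering mentioned above works by firing the array elements with staggered time delays so that their wavefronts add constructively along the desired direction. The sketch below computes those delays for a linear array in water; the element count, pitch, steering angle, and function name are illustrative assumptions, not parameters of the Glenn project.

        # Illustrative sketch: firing delays that steer a linear ultrasonic phased array.
        # Element count, pitch, and steering angle are assumed example values.
        import math

        def steering_delays(num_elements, pitch_m, angle_deg, sound_speed_m_s=1480.0):
            """Per-element firing delays (s) that tilt the beam by angle_deg in water."""
            theta = math.radians(angle_deg)
            delays = [n * pitch_m * math.sin(theta) / sound_speed_m_s
                      for n in range(num_elements)]
            t0 = min(delays)  # shift so the earliest element fires at t = 0
            return [d - t0 for d in delays]

        for n, d in enumerate(steering_delays(8, 0.5e-3, 20.0)):
            print(f"element {n}: {d * 1e9:.1f} ns")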

    Reannealed Fiber Bragg Gratings Demonstrated High Repeatability in Temperature Measurements

    Fiber Bragg gratings (FBGs) are formed by periodic variations of the refractive index of an optical fiber. These periodic variations allow an FBG to act as an embedded optical filter, passing the majority of light propagating through a fiber while reflecting back a narrow band of the incident light. The peak reflected wavelength of the FBG is known as the Bragg wavelength. Since the period and width of the refractive index variation in the fiber determine the wavelengths that are transmitted and reflected by the grating, any force acting on the fiber that alters the physical structure of the grating will change the wavelengths that are transmitted and reflected by it. Both thermal and mechanical forces acting on the grating will alter its physical characteristics, allowing the FBG sensor to detect both temperature variations and the physical stresses and strains placed upon it. This ability to sense multiple physical forces makes the FBG a versatile sensor. To assess the feasibility of using Bragg gratings as temperature sensors for propulsion applications, researchers at the NASA Glenn Research Center evaluated the performance of Bragg gratings at elevated temperatures up to 300 C. For these purposes, commercially available polyimide-coated high-temperature gratings were used that had been annealed by the manufacturer to 300 C. To ensure the most thermally stable gratings at the operating temperatures, we reannealed the gratings to 400 C at a very slow rate for 12 to 24 hr until their reflected optical powers had stabilized. The reannealed gratings were then subjected to periodic thermal cycling from room temperature to 300 C, and their peak reflected wavelengths were monitored. The setup shown was used for reannealing and thermally cycling the FBGs. Signals from the photodetectors and the spectrum analyzer were fed into a computer equipped with LabVIEW software. The software synchronously monitored the oven/furnace temperature and the optical spectrum analyzer and processed the data. Experimental results presented in the following graph show the typical wavelength-versus-temperature dependence of a reannealed FBG through six thermal cycles (80 hr). The average standard deviation of the temperature-to-wavelength relationship ranged from 1.86 to 2.92 C over the six thermal cycles each grating was subjected to. This is an error of less than 1.0 percent of full scale throughout the entire evaluation temperature range from ambient to 300 C.
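
    The temperature-to-wavelength behavior exploited above follows from the standard Bragg condition; the expressions below are the textbook relations (they are not quoted in the abstract), with the coefficients treated as properties of the particular fiber:

        \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda,
        \qquad
        \frac{\Delta \lambda_B}{\lambda_B} \approx (\alpha + \xi)\, \Delta T

    where n_eff is the effective refractive index of the fiber core, Lambda is the grating period, alpha is the fiber's thermal expansion coefficient, and xi is its thermo-optic coefficient; the Bragg wavelength therefore shifts nearly linearly with temperature, consistent with the wavelength-versus-temperature data described above.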

    High-Temperature Optical Sensor

    A high-temperature optical sensor (see Figure 1) has been developed that can operate at temperatures up to 1,000 C. The sensor development process consists of two parts: packaging of a fiber Bragg grating into a housing that yields a sturdier, more thermally stable device, and a technological process to which the device is subjected in order to meet environmental requirements of several hundred C. This technology uses a newly discovered phenomenon, the formation of thermally stable secondary Bragg gratings in communication-grade fibers at high temperatures, to construct robust, optical, high-temperature sensors. Testing and performance evaluation (see Figure 2) of packaged sensors demonstrated operability of the devices at 1,000 C for several hundred hours and during numerous thermal cycles from 400 to 800 C with different heating rates. The technology significantly extends the applicability of optical sensors to high-temperature environments, including ground testing of engines, flight propulsion control, thermal protection monitoring of launch vehicles, etc. It may also find applications in such non-aerospace arenas as monitoring of nuclear reactors, furnaces, chemical processes, and other high-temperature environments where other measurement techniques are either unreliable, dangerous, undesirable, or unavailable.

    System Developed for Bulk Flow Imaging of a Two-Phase Fluid in a Cylindrical Couette

    The Microgravity Observation of Bubble Interactions (MOBI) experiment is working to better understand the physics of gas-liquid suspensions. To study such suspensions, researchers generate bubbles in a large cylindrical flow channel. Then, they use various types of instrumentation, including video imaging, to study the bubbly suspension. Scientists will need a camera view of the majority of the gas-liquid suspension inside the Couette in order to gather the information needed from the MOBI experiment. This will provide the scientists with a qualitative picture of the flow that may indicate flow instabilities or imperfect axial mixing inside the Couette. These requirements pose a significant challenge because the imaging and lighting system must be confined to a very tight volume, since the space available on the International Space Station experiment racks is very limited. In addition, because of the large field of view needed and the detail needed to see the gas-liquid suspension behavior in the image, a digital video camera with high resolution (1024 by 1024 pixels) had to be used. Although the high-resolution camera provides scientists with the image quality they need, it leaves little space on the experiment rack for the lighting system. Many configurations were considered for the lighting system, including front-lighting and back-lighting, but because of mechanical design limitations with the Couette, back-lighting was not an option.
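
    As a rough illustration of the field-of-view versus detail trade-off described above, the smallest resolvable feature scales with the field of view divided by the pixel count; the 300 mm viewing height below is an assumed example value, not a MOBI specification.

        # Rough field-of-view vs. resolution estimate for a 1024 by 1024 sensor.
        # The 300 mm field of view is an assumed example value.
        field_of_view_mm = 300.0
        pixels = 1024
        print(f"{field_of_view_mm / pixels:.2f} mm per pixel")  # about 0.29 mm per pixel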

    Component-Level Electronic-Assembly Repair (CLEAR) Analysis of the Problem Reporting and Corrective Action (PRACA) Database of the International Space Station On-Orbit Electrical Systems

    The NASA Constellation Program is investigating and developing technologies to support human exploration of the Moon and Mars. The Component-Level Electronic-Assembly Repair (CLEAR) task is part of the Supportability Project managed by the Exploration Technology Development Program. CLEAR is aimed at enabling a flight crew to diagnose and repair electronic circuits in space while minimizing logistics spares, equipment, and crew time and training. For insight into actual space repair needs, in early 2008 the project examined the operational experience of the International Space Station (ISS) program. CLEAR examined the ISS on-orbit Problem Reporting and Corrective Action database for electrical and electronic system problems. The ISS has higher-than-predicted reliability, yet, as expected, it has persistent problems. A goal was to identify which on-orbit electrical problems could be resolved by a component-level replacement. A further goal was to identify problems that could benefit from the additional diagnostic and test capability that a component-level repair capability could provide. The study indicated that many problems stem from a small set of root causes that also represent distinct component problems. The study also determined that there are certain recurring problems for which the current telemetry instrumentation and built-in tests are unable to completely resolve the problem; as a result, the root cause is listed as unknown. Overall, roughly 42 percent of on-orbit electrical problems on the ISS could be addressed with a component-level repair. Furthermore, 63 percent of on-orbit electrical problems on the ISS could benefit from additional external diagnostic and test capability. These results indicate that in situ component-level repair in combination with diagnostic and test capability can be expected to increase system availability and reduce logistics. The CLEAR approach can increase the flight crew's ability to act decisively to resolve problems while reducing dependency on Earth-supplied logistics for future Constellation Program missions.

    Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept

    This Component-Level Electronic-Assembly Repair (CLEAR) Operational Concept document was developed as a first step in developing the Component-Level Electronic-Assembly Repair (CLEAR) System Architecture (NASA/TM-2011-216956). The CLEAR operational concept defines how the system will be used by the Constellation Program and what needs it meets. The document creates scenarios for major elements of the CLEAR architecture. These scenarios are generic enough to apply to near-Earth, Moon, and Mars missions. The CLEAR operational concept involves basic assumptions about the overall program architecture and its interactions with the CLEAR system architecture. The assumptions include spacecraft and operational constraints for near-Earth orbit, Moon, and Mars missions. This document addresses an incremental development strategy in which capabilities evolve over time, but it is structured to prevent obsolescence. The approach minimizes flight hardware by exploiting Internet-like telecommunications that enable CLEAR capabilities to remain on Earth and to be uplinked as needed. To minimize crew time and operational cost, CLEAR exploits offline development and validation to support online teleoperations. Operational concept scenarios are developed for diagnostics, repair, and functional test operations. Many of the supporting functions defined in these operational scenarios are further defined as technologies in NASA/TM-2011-216956.

    Component-Level Electronic-Assembly Repair (CLEAR) System Architecture

    This document captures the system architecture for a Component-Level Electronic-Assembly Repair (CLEAR) capability needed for electronics maintenance and repair within the Constellation Program (CxP). CLEAR is intended to improve flight system supportability and reduce the mass of spares required to maintain the electronics of human-rated spacecraft on long-duration missions. By necessity, it allows the crew to make repairs that would otherwise be performed by Earth-based repair depots. Because of the practical knowledge and skill limitations of small spaceflight crews, they must be augmented by Earth-based support crews and automated repair equipment. This system architecture covers the complete system from ground user to flight hardware and flight crew and defines an Earth segment and a Space segment. The Earth segment involves database management, operational planning, and remote equipment programming and validation processes. The Space segment involves the automated diagnostic, test, and repair equipment required for a complete repair process. This document defines three major subsystems: teleoperations that link the flight hardware to ground support, highly reconfigurable diagnostic and test instruments, and a CLEAR Repair Apparatus that automates the physical repair process.

    Portable Unit for Metabolic Analysis

    The Portable Unit for Metabolic Analysis (PUMA) is an instrument that measures several quantities indicative of human metabolic function. Specifically, this instrument makes time-resolved measurements of temperature, pressure, flow, and the partial pressures of oxygen and carbon dioxide in breath during both inhalation and exhalation. Portable instruments for measuring these quantities have been commercially available, but the response times of those instruments are too long to enable temporal resolution of phenomena on the time scales of human respiration cycles. In contrast, the response time of the PUMA is significantly shorter than the characteristic times of human respiration phenomena, making it possible to analyze varying metabolic parameters not only on sequential breath cycles but also at successive phases of inhalation and exhalation within the same breath cycle. In operation, the PUMA is positioned to sample breath near the subject's mouth. Commercial off-the-shelf sensors are used for three of the measurements: a miniature pressure transducer for pressure, a thermistor for temperature, and an ultrasonic sensor for flow. Sensors developed at Glenn Research Center are used for measuring the partial pressures of oxygen and carbon dioxide. The carbon dioxide sensor exploits the relatively strong absorption of infrared light by carbon dioxide: light from an infrared source passes through the stream of inhaled or exhaled gas and is focused on an infrared-sensitive photodetector. The oxygen sensor exploits the effect of oxygen in quenching the fluorescence of ruthenium-doped organic molecules in a dye on the tip of an optical fiber. A blue laser diode is used to excite the fluorescence, and the optical fiber carries the fluorescent light to a photodiode whose output varies in a known relationship with the rate of quenching of the fluorescence and, hence, with the partial pressure of oxygen. The outputs of the sensors are digitized, preprocessed by a small onboard computer, and then sent wirelessly to a desktop computer, where the collected data are analyzed and displayed. In addition to the raw data on temperature, pressure, flow, and mole fractions of oxygen and carbon dioxide, the display can include volumetric oxygen consumption, volumetric carbon dioxide production, respiratory equivalent ratio, and volumetric flow rate of exhaled gas.
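
    A minimal sketch of how breath-by-breath quantities such as oxygen consumption, carbon dioxide production, and their ratio could be derived from the time-resolved flow and mole-fraction samples described above. The function name, the simple rectangular integration, the sign convention, and the neglect of humidity and volume corrections are all assumptions for illustration; this is not the PUMA flight software.

        # Illustrative breath-by-breath metabolic summary from time-resolved samples.
        # Assumptions: flow > 0 on inhalation, < 0 on exhalation; rectangular integration;
        # fixed ambient gas composition; no humidity or STPD corrections.
        def metabolic_summary(times_s, flow_lps, x_o2, x_co2,
                              x_o2_in=0.209, x_co2_in=0.0004):
            """Return (VO2 in liters, VCO2 in liters, ratio VCO2/VO2) over the samples."""
            vo2 = vco2 = 0.0
            for i in range(1, len(times_s)):
                dt = times_s[i] - times_s[i - 1]
                q = flow_lps[i]
                if q > 0:      # inhalation: gas enters at ambient composition
                    vo2 += q * x_o2_in * dt
                    vco2 -= q * x_co2_in * dt
                else:          # exhalation: gas leaves at the measured composition
                    vo2 -= (-q) * x_o2[i] * dt
                    vco2 += (-q) * x_co2[i] * dt
            ratio = vco2 / vo2 if vo2 > 0 else float("nan")
            return vo2, vco2, ratio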

    Portable Unit for Metabolic Analysis

    The Portable Unit for Metabolic Analysis measures human metabolic function. The compact invention attaches to the face of a subject and records highly time-resolved measurements of air temperature and pressure, flow rates during inhalation and exhalation, and oxygen and carbon dioxide partial pressures. The device is capable of "breath-by-breath" analysis and "within-breath" analysis at high temporal resolution.

    Effects of Anacetrapib in Patients with Atherosclerotic Vascular Disease

    BACKGROUND: Patients with atherosclerotic vascular disease remain at high risk for cardiovascular events despite effective statin-based treatment of low-density lipoprotein (LDL) cholesterol levels. The inhibition of cholesteryl ester transfer protein (CETP) by anacetrapib reduces LDL cholesterol levels and increases high-density lipoprotein (HDL) cholesterol levels. However, trials of other CETP inhibitors have shown neutral or adverse effects on cardiovascular outcomes. METHODS: We conducted a randomized, double-blind, placebo-controlled trial involving 30,449 adults with atherosclerotic vascular disease who were receiving intensive atorvastatin therapy and who had a mean LDL cholesterol level of 61 mg per deciliter (1.58 mmol per liter), a mean non-HDL cholesterol level of 92 mg per deciliter (2.38 mmol per liter), and a mean HDL cholesterol level of 40 mg per deciliter (1.03 mmol per liter). The patients were assigned to receive either 100 mg of anacetrapib once daily (15,225 patients) or matching placebo (15,224 patients). The primary outcome was the first major coronary event, a composite of coronary death, myocardial infarction, or coronary revascularization. RESULTS: During the median follow-up period of 4.1 years, the primary outcome occurred in significantly fewer patients in the anacetrapib group than in the placebo group (1640 of 15,225 patients [10.8%] vs. 1803 of 15,224 patients [11.8%]; rate ratio, 0.91; 95% confidence interval, 0.85 to 0.97; P=0.004). The relative difference in risk was similar across multiple prespecified subgroups. At the trial midpoint, the mean level of HDL cholesterol was higher by 43 mg per deciliter (1.12 mmol per liter) in the anacetrapib group than in the placebo group (a relative difference of 104%), and the mean level of non-HDL cholesterol was lower by 17 mg per deciliter (0.44 mmol per liter), a relative difference of -18%. There were no significant between-group differences in the risk of death, cancer, or other serious adverse events. CONCLUSIONS: Among patients with atherosclerotic vascular disease who were receiving intensive statin therapy, the use of anacetrapib resulted in a lower incidence of major coronary events than the use of placebo. (Funded by Merck and others; Current Controlled Trials number, ISRCTN48678192; ClinicalTrials.gov number, NCT01252953; and EudraCT number, 2010-023467-18.)
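
    The primary-outcome figures quoted above can be roughly checked from the raw event counts. The sketch below uses simple event proportions; the published rate ratio and confidence interval come from the trial's time-to-event analysis, so the agreement is only approximate.

        # Rough check of the primary-outcome percentages and ratio quoted in the abstract.
        anacetrapib_events, anacetrapib_n = 1640, 15225
        placebo_events, placebo_n = 1803, 15224
        p_a = anacetrapib_events / anacetrapib_n   # ~0.108 (10.8%)
        p_p = placebo_events / placebo_n           # ~0.118 (11.8%)
        print(f"{p_a:.3f} vs {p_p:.3f}, ratio ~ {p_a / p_p:.2f}")  # ratio ~ 0.91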